

Teen sues AI tool maker over fake nude images

FOX News

A 17-year-old's lawsuit against an AI clothes removal company highlights growing privacy concerns as fake nude images spread through schools and social media.


We're Completely Unprepared for the Deepfake Porn Boom

Slate

Last week, A.I.–generated nude images of pop superstar Taylor Swift were produced and distributed without her consent. They circulated throughout the internet, with one single post on X (née Twitter) garnering 45 million views before the site took it down. Deepfakes, as they've come to be called in recent years, often target female celebrities, but with the rise of A.I., it's easier than ever for everyday people (almost always women) to be targeted. Last year, more than 143,000 deepfake porn videos were created, according to one estimate from the independent researcher Genevieve Oh, more than every other previous year combined. That number will, in all likelihood, only continue to rise.


Disturbing app can create nude images of ANY woman

Daily Mail - Science & tech

A disturbing app has been developed which uses artificial intelligence and algorithms to produce fake nude images of women. The app, called DeepNude, removes all clothing from any uploaded image of a woman - sparking fears it could be used to blackmail unsuspecting victims with fake revenge porn threats. Since the app came to light, it has been taken offline, with its developers claiming it 'cannot cope' with the volume of interest. The anonymous developers said they would be back within days and just needed 'to fix some bugs and catch our breath'. In the free version of the app, the output images are partially covered with a large watermark.